ASU entrepreneurs develop smart street cameras


It’s said that nothing is certain, except death and taxes. Let’s add a third certainty to that list: traffic.

All across the globe, traffic engineers and city planners are locked in an eternal struggle to improve the flow of traffic, the efficiency of streets and the safety of pedestrians, cyclists and drivers. Finding the best way to meet these goals requires an enormous amount of data, which is often difficult to collect and analyze.

Two Arizona State University entrepreneurs are making this data easier to understand and access. Mohammad Farhadi and Yezhou Yang founded Argos Vision, a tech startup developing smart traffic cameras that can passively capture, analyze and deliver droves of data to help cities improve road safety and efficiency.

Argos Vision emerged from Farhadi and Yang’s work as researchers in the School of Computing and Augmented Intelligence, one of the Ira A. Fulton Schools of Engineering. Yang, an assistant professor of computer science and engineering and director of the Active Perception Group, advised Farhadi as he pursued a doctorate in computer science. Farhadi earned his doctoral degree in spring 2022.

The pair created a self-contained, solar-powered traffic camera that uses on-board computer vision, a type of artificial intelligence, to identify and classify what it sees.

“We identified three major things we wanted to accomplish with this technology,” Farhadi says. “Cost reduction, privacy protection and rich metadata extraction.”

Installing traffic cameras can be costly for local governments. Closing intersections to add new power and network cabling to existing infrastructure is a lengthy and expensive process. Argos Vision solves this financial roadblock with a self-contained camera system that runs off solar power and transmits data over a cellular network.

“We want to extract rich data that meets not only the minimum desire of cities, such as vehicle counting, but data that can be used in the future as well,” Farhadi says.

Named for the many-eyed giant of Greek myth, the Argos algorithm can also capture detailed contextual information, including vehicle type, dimensions, color and markings, and it can build a 3D model of vehicles for future reference.
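Argos Vision’s actual data schema is not public, so purely as a rough illustration, a per-vehicle record of the kind described above might look something like the following Python sketch (every field name here is a hypothetical stand-in):

```python
from dataclasses import dataclass, field

@dataclass
class VehicleObservation:
    """One anonymized vehicle record of the kind the article describes.

    All field names are illustrative guesses; Argos Vision's real
    schema is not public.
    """
    timestamp: float      # UNIX time of the observation
    vehicle_class: str    # e.g. "car", "bus", "semi_truck"
    length_m: float       # estimated dimensions in meters
    width_m: float
    height_m: float
    color: str            # dominant body color
    markings: list[str] = field(default_factory=list)  # e.g. fleet logos
```

Note that nothing in such a record identifies a person, which matters for the privacy goal the founders describe.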

Distinguishing vehicle type could be helpful for road maintenance. Roads degrade at different rates depending on their use, and understanding which vehicles use which roads at high rates may help cities better allocate resources and predict where preventative maintenance is most needed. For example, an Argos camera might observe large trucks commonly using a shortcut to access an industrial area.

“At that location, a city might elect to reinforce a road so they don’t have to replace it every year,” Farhadi says.
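To make that concrete, a toy Python sketch (with made-up segment names and observations) shows how per-class counts from such cameras could rank road segments by heavy-vehicle use:

```python
from collections import Counter

# Hypothetical stream of (road_segment, vehicle_class) observations.
observations = [
    ("industrial_shortcut", "semi_truck"),
    ("industrial_shortcut", "semi_truck"),
    ("industrial_shortcut", "car"),
    ("main_st", "car"),
    ("main_st", "bus"),
]

HEAVY = {"semi_truck", "bus"}  # classes that wear roads down fastest

# Rank segments by heavy-vehicle traffic to prioritize maintenance.
heavy_counts = Counter(seg for seg, cls in observations if cls in HEAVY)
for segment, count in heavy_counts.most_common():
    print(f"{segment}: {count} heavy vehicles")
```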

Despite the detail of the information it collects, the Argos Vision technology does not employ facial recognition or gather any identifying information, protecting the privacy of everyone on the road.

Argos extracts detailed information using a novel software framework developed by Farhadi. As the Argos cameras take images, a neural network analyzes each image’s content and distills it into its component parts. Much like our brains quickly parse what we see into separate elements — person, dog on a leash, bus stop — a neural network uses a similar process to contextualize information.

Traditionally, neural networks are computationally and power intensive, especially on small devices such as cameras. But Argos Vision’s software allows its neural network to run on low power while providing real-time traffic monitoring that collects incredibly detailed data, says Yang.
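Argos’ low-power framework itself is proprietary, but the underlying pattern — a detector that reduces each frame to anonymous per-class counts so no imagery has to leave the device — can be sketched with an off-the-shelf model. The sketch below uses torchvision’s pretrained Faster R-CNN purely as a generic stand-in; it is far heavier than anything that would actually run on a solar-powered camera:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# COCO class ids relevant to traffic scenes. This pretrained detector is
# a generic stand-in for illustration, not Argos Vision's network.
TRAFFIC_CLASSES = {1: "person", 2: "bicycle", 3: "car", 6: "bus", 8: "truck"}

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def summarize_frame(frame: torch.Tensor, min_score: float = 0.6) -> dict:
    """Reduce one frame to per-class counts. The raw image is never
    stored or transmitted, so only anonymous metadata leaves the device."""
    with torch.no_grad():
        (result,) = model([frame])  # frame: float tensor, CxHxW, in [0, 1]
    counts: dict[str, int] = {}
    for label, score in zip(result["labels"].tolist(),
                            result["scores"].tolist()):
        name = TRAFFIC_CLASSES.get(label)
        if name is not None and score >= min_score:
            counts[name] = counts.get(name, 0) + 1
    return counts

# A random tensor stands in for a camera capture in this demo.
print(summarize_frame(torch.rand(3, 480, 640)))
```

Compressing a network like this far enough to run continuously on solar power is, by the article’s account, exactly where Farhadi’s framework comes in.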

Say a city wants to figure out why the intersection of Main Street and First Avenue is frequently congested. The city might send someone to observe traffic, or put down road sensors to count cars, or use mobile phone sensors to estimate the number of drivers in the area.

Argos Vision's test camera overlooks the intersection of Mill Avenue and Seventh Street in Tempe, Arizona. The self-contained, solar-powered traffic camera uses on-board artificial intelligence to identify and classify what it sees, providing a persistent stream of useful information for city planners and engineers. Photo courtesy Argos Vision

The problem with these methods is that the data collected is imprecise. Human observation only offers a snapshot of traffic and is prone to error. Road sensors don’t differentiate between buses, cars or emergency vehicles. Mobile data can’t tell whether 15 phone signals passing through an intersection represent 15 drivers or a mix of drivers, bus riders and pedestrians.

“This doesn’t give you a clear picture, because these are snapshots of data. Traffic has a dynamic nature,” Farhadi says. “The beauty of using a computer vision-based system like ours is that it gives cities a permanent, precise flow of information.”

Yang and Farhadi also see potential for the Argos system to augment and improve the function of autonomous vehicles.

“We can provide autonomous vehicles with situational awareness of other vehicles or pedestrians outside the scope of their on-board sensors,” Yang says. “Also, our rich metadata could help local authorities measure how safe the AVs are while operating on public roads.”

“Many of these research ideas, I have to attribute to Mohammad, thanks to his constant exploration of what is possible,” Yang says.

The permanent flow of data supplied by Argos cameras can help cities evaluate more than just motor vehicle traffic. It could also help policymakers and city planners improve safety for all road users.

“Pedestrians are a big factor in street traffic,” Farhadi says. “Arizona has one of the highest pedestrian fatality rates, and we want to understand why that is happening and how to prevent it.”

Argos cameras will be lending their vision to Arizona streets starting this summer, helping improve road safety for all users.

In partnership with the city of Phoenix Street Transportation Department, Argos Vision cameras will be installed at the intersections of Third Avenue and Adams Street and First and Taylor streets for a one-year pilot program.

Both downtown locations — near City Hall and ASU’s Downtown Phoenix campus, respectively — were chosen for their high pedestrian activity, says Simon T. Ramos, a traffic management and operations engineer in the Phoenix Street Transportation Department.

Along with collecting standard traffic information, like the number of vehicles, pedestrians and cyclists, the Argos cameras will also catalog near-miss data.

“Say there's a close call, where a vehicle crosses the path of a pedestrian. We can identify these conflict hot spots,” Ramos says.

Through its persistent monitoring and evaluation, Argos’ data will identify conflict areas between vehicles, bicycles and pedestrians. Ramos and his department can then use the near-miss data to develop tailored safety measures, such as changing signal timing or road markings, to mitigate those conflicts.
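The article doesn’t describe Argos’ conflict metric, but as a toy illustration of the kind of near-miss flagging Ramos describes, one could simply threshold the minimum separation between two tracked road users (real systems use more sophisticated measures, such as post-encroachment time):

```python
import math

def min_separation(traj_a, traj_b):
    """Minimum distance (meters) between two road users whose positions
    were sampled at the same timestamps; each trajectory is a list of (x, y)."""
    return min(math.dist(p, q) for p, q in zip(traj_a, traj_b))

def is_near_miss(vehicle_traj, pedestrian_traj, threshold_m=1.5):
    """Flag a conflict when a vehicle passes within threshold_m of a
    pedestrian. The 1.5 m threshold is an arbitrary illustrative choice."""
    return min_separation(vehicle_traj, pedestrian_traj) < threshold_m

# Hypothetical tracked positions, sampled once per second:
vehicle = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
pedestrian = [(10.0, 4.0), (10.0, 2.0), (10.0, 1.0)]
print(is_near_miss(vehicle, pedestrian))  # True: closest approach is 1.0 m
```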

This effort aligns with Phoenix’s plan to incorporate Vision Zero principles into its Road Safety Action Plan. Vision Zero — a strategy to eliminate traffic fatalities and increase mobility within urban areas — was adopted by Phoenix City Council in early 2022, joining more than 40 other U.S. communities striving for safer, more equitable roadways.

The city already has an array of traffic cameras collecting data, but Argos provides a more cost-effective alternative to existing systems.

“What really kind of drew our attention to this specific technology was it is economically cheaper than the competition,” Ramos says. “Phoenix is committed to working smarter and spending wisely, and it’s an ongoing effort to identify technologies to improve travel times and reduce congestion and accidents.”

The Argos Vision team is looking forward to contributing to the city’s goals while refining their technology.

“Together with the city, we are excited to bring advanced AI technologies from ASU onto Arizona roads for social good,” Yang says.

Farhadi and Yang’s collaboration goes back to 2016, when both were newcomers to ASU.

“The school organized a student recruitment session, and I brought a poster of my research,” Yang says. “Four or five people stopped by, but Mohammad was the only person who was interested.”

Argos Vision was born from the combination of Yang’s expertise in computer vision and Farhadi’s background in hardware acceleration and computer networks. When the pair began looking for the most lucrative use of their technology, they first landed on shopping malls.

“We focused on tracking the movement and amount of people to improve the HVAC efficiency in a retail area,” Farhadi says.

However, they found this route to be a dead end. Not only were many competitors pursuing the same application, but stores simply couldn’t justify the installation cost against the savings on heating and cooling. Retailers also wanted a system that could tell them more about their customers.

“We couldn’t tell you everything about somebody,” says Ryan Kemmet, Argos’ business and legal adviser. “We don’t have facial recognition and we can’t link people to their Facebook account or anything.”

Kemmet was drawn into the Argos orbit when Farhadi and Yang joined the National Science Foundation Innovation Corps Site at ASU (NSF I-Corps). The five-week training program, led by the J. Orin Edson Entrepreneurship + Innovation Institute, includes entrepreneurial training, industry mentorship and financial support for researchers looking to commercialize their research.

Kemmet served as Argos’ industry mentor during their ASU I-Corps participation, which serves as a springboard for the nationwide NSF I-Corps program. After completing the ASU program, they were selected to continue onto the national version.

“It’s quite an intensive program,” Kemmet says. “We went through some initial ideas of what we thought the applications of this technology would be, but it was the work in the national I-Corps program that helped us define the beachhead application for this technology.”

Left to right: Yezhou Yang, Mohammad Farhadi and Ryan Kemmet of Argos Vision. Photo courtesy Argos Vision

I-Corps, along with Farhadi and Yang’s professional experience and interests, ultimately led Argos to traffic monitoring. Farhadi learned about the growing need for active traffic monitoring during a 2020 summer internship with the Ford Motor Company. Yang saw the potential from his work with the Institute of Automated Mobility, which brings together academia, government and industry to develop a safe, efficient ecosystem to support testing and adoption of autonomous vehicles in Arizona.

Prior to participating in I-Corps, Yang and Farhadi participated in a number of Edson Entrepreneurship + Innovation Institute programs to strengthen their venture and connect to resources and entrepreneurial communities. 

Argos joined Edson E+I Institute’s Venture Devils in 2020. The program provides mentorship and support to fledgling businesses, social enterprises and nonprofits founded by ASU students, faculty, staff and local community members with ties to ASU. The program includes an opportunity to participate in Demo Day, a biannual pitch competition where Venture Devils startups make their case for investment to a range of funding sources. In the fall 2021 Demo Day, Argos secured $6,500 in funding.

They also enrolled in the National Security Academic Accelerator (NSA2) to explore the national security applications of their technology. A partnership between Edson E+I and the National Security Innovation Network, NSA2 creates connections between ASU-led ventures and Department of Defense representatives and opportunities, as well as providing tailored training and mentorship. NSA2 was instrumental in helping Argos navigate the complexities of assembling a proposal for a Small Business Innovation Research award with the Department of Transportation.

“It’s a powerful resource,” says Farhadi of Edson E+I. “Coming from Iran, I had entrepreneurial experience, but the U.S. has a totally different culture, totally different business landscape. Edson E+I has connected us with the right people, like Ryan, and really propelled Argos Vision.”

In Iran, Farhadi ran a business providing internet-based phone service and network security to remote regions. From a young age, he watched his father found and operate a telecom company, which left an impression on him.

“Iran is a consumer country; most of the time, technology is imported from elsewhere,” he says. “But when my father started selling his devices in-country, suddenly there was trust in a local company. That’s something I’ve tried to pursue in my life — people trusting your work.”

Despite entrepreneurship being a family tradition, starting a company wasn’t on his mind when he came to the U.S. to study. However, Farhadi relishes the opportunity to forge his own path.

“When you work at a company, you work within someone else’s system, you have specific goals that are assigned to you. You might be able to achieve them however you want, but they aren’t your goals,” Farhadi says. “As an entrepreneur, you create your own system. You set your own goals.”

Yang, recently named a Fulton Entrepreneurial Professor, says Edson E+I resources and programs are preparing entrepreneurs in AI like himself and Farhadi for very timely opportunities.

“As a professor in AI, I wouldn’t have been interested in entrepreneurship 20 or 30 years ago. The technology was just not ready,” he says. “Right now, we’re at a very special time, where the technology is maturing and the market is very hungry for real-world applications. So having the connections and resources facilitated by ASU and Edson E+I to find those applications has been very helpful.”

Top photo illustration by Travis Buckner



An Arizona State University professor was part of a team whose research may have discovered how cancer cells suppress the body’s production of insulin, leading to a higher risk of diabetes for women who have had breast cancer.

Dorothy Sears, a professor of nutrition in the College of Health Solutions, was part of a group of researchers whose study was recently published in the journal Nature Cell Biology. She researches obesity and the risk for obesity-related diseases such as insulin resistance, diabetes, cardiovascular disease and cancer.

“We know that people with cancer have an increased risk of diabetes – that’s been researched a lot,” she said.

“But what hasn’t been researched was how does cancer increase the risk for diabetes? The key finding of this study was the mechanism by which the cancer is increasing the risk for diabetes and exactly how that happens.

“It was an elegant series of experiments,” she said of the study. The team, in addition to Sears, one of the supervisors of the research, included scientists from the University of California at San Diego, City of Hope, Regulus Therapeutics and the University of California at Riverside.

Dorothy Sears, a professor of nutrition in the College of Health Solutions, is currently analyzing the data from a study that looks at how the built environment affects people's risk of cancer.

The study found that the mechanism connecting breast cancer to diabetes is “extracellular vesicles.” Breast cancer cells shed these vesicles, which are hollow spheres that carry DNA, RNA, fats and proteins among cells. The breast cancer cells secrete molecules called microRNA-122 into the vesicles, which then travel to the pancreas and latch onto insulin-producing cells. There, the microRNA-122 damages the pancreatic cells’ ability to secrete insulin and maintain normal blood glucose levels.

That leads to higher blood glucose levels in breast cancer patients, elevating their risk of diabetes.

The initial research was done with mouse models.

“What was missing and where I came in was showing that elevation of these microRNAs was really existing in humans,” Sears said.

Shizhen Emily Wang, professor of pathology at UC San Diego School of Medicine, had serum samples from breast cancer patients, but not from women who were cancer-free.

“She needed serum from control women to show that these microRNAs were elevated in the cancer patients, just like in the mice,” Sears said.

Sears had many serum samples in her freezer from women who were cancer-free, and she was able to match their characteristics, such as age and body mass index, to Wang’s samples of people who had cancer.

"Then (Wang) was able to measure the concentration of microRNA in her breast cancer population versus my cancer-free population and show that the hers had more microRNA-122," Sears said.

“Then, she took the microRNA-122 from her population and put it on beta cells in culture and showed that the microRNA-122 from the patients did the same things in the culture as it did in the mouse model, which was to inhibit insulin secretion.”


Researchers knew that extracellular vesicles could mediate cell-to-cell communication.

“They’re bubbling off and getting into the blood, which they have to get through barriers to do, and then they’re fusing to the recipient cells and delivering little envelopes of stuff to those target cells," Sears said.

“In each vesicle, there can be hundreds of microRNAs and they can deliver a lot of material to cells over time.

“MicroRNA-122 are powerful little guys that inhibit gene expression.”

Tumors grow quickly, using sugar as fuel, Sears said.

“So it’s in their best interest to impair insulin secretion, because if that happens, the blood glucose level rises, and now they have more fuel so they can grow faster,” she said.

“The good thing is that, in theory, with this kind of information, we can say ‘Aha, I see you microRNA-122, and I can block you.’”

Several clinical trials of anti-microRNAs are now underway.

“They’re treating microRNAs with another microRNA, so you’re velcroing it to itself,” Sears said.

She had the samples because of her ongoing research study, which analyzes biomarkers in people to determine how their exposure to the built environment influences their risk of cancer.

“The built environment might be the walkability of a neighborhood. Can they walk to the market or do they have to get into a car? And do they live near areas where there’s lots of pollution, like near a freeway? And are they near green space or the ocean or water-treatment plants?

“Air pollution exposure impacts insulin resistance, which is a precursor for diabetes and obesity and cancer.”

Sears was at UC San Diego when she began working on the study, which recruited more than 600 people and collected blood, urine and stool samples. The participants also answered a questionnaire and wore GPS devices for two weeks. She’s still analyzing the results.

The participants agreed to allow their samples to be banked, which is why they were available for the microRNA-122 study.

“We collected a ton of data, and I have students at ASU working on many aspects of the study,” she said.
